# Uzbek BERT
## Bertbek News Big Cased

- **Author:** elmurod1202
- **License:** MIT
- **Description:** A pre-trained BERT model for Uzbek (12-layer, case-sensitive), trained on a large news corpus (Daryo).
- **Tags:** Large Language Model · Transformers · Other
## Uzbert Base Uncased

- **Author:** coppercitylabs
- **License:** MIT
- **Description:** A pre-trained BERT model for Uzbek (Cyrillic script), trained with masked language modeling and next sentence prediction objectives.
- **Tags:** Large Language Model · Transformers · Other
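
Both checkpoints follow the standard BERT masked-language-modeling interface, so they can be queried with the Transformers `fill-mask` pipeline. The snippet below is a minimal sketch, not taken from either model card: the Hub repo ID and the Cyrillic probe sentence are assumptions you should replace with the checkpoint and text you actually want to test.

```python
# Minimal sketch: masked-token prediction with an Uzbek BERT checkpoint.
# The repo ID below is an assumption -- substitute the actual Hub ID of the
# model you want (e.g. the coppercitylabs or elmurod1202 checkpoint above).
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="coppercitylabs/uzbert-base-uncased",  # assumed Hub ID
)

# Uzbert Base Uncased was trained on Cyrillic-script Uzbek, so the probe
# sentence is written in Cyrillic; use Latin-script text for BERTbek.
for pred in fill_mask("Тошкент – Ўзбекистоннинг [MASK] шаҳри."):
    print(f"{pred['token_str']:>12}  score={pred['score']:.3f}")
```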